Distributed Approximation of Joint Measurement Distributions Using Mixtures of Gaussians
Authors
Abstract
This paper presents an approach to distributively approximate the continuous probability distribution that describes the fusion of sensor measurements from many networked robots. Each robot forms a weighted mixture of Gaussians to represent the measurement distribution of its local observation. From this mixture set, the robot then draws samples of Gaussian elements to enable the use of a consensus-based algorithm that evolves the corresponding canonical parameters. We show that these evolved parameters describe a distribution that converges weakly to the joint of all the robots' unweighted mixture distributions, which itself converges weakly to the joint measurement distribution as more system resources are allocated. The major innovation of this approach is to combine sample-based sensor fusion with the notion of pre-convergence termination, which results in a scalable multi-robot system. We also derive bounds and convergence rates for the approximated joint measurement distribution, specifically for the elements of its information vectors and the eigenvalues of its information matrices. Most importantly, these performance guarantees do not come at the cost of complexity: the computational and communication complexity scales quadratically with the Gaussian dimension, linearly with the number of samples, and is constant with respect to the number of robots. Results from numerical simulations for object localization are discussed using both Gaussians and mixtures of Gaussians.
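The abstract leaves the fusion machinery implicit, so the following is a minimal sketch of the underlying idea rather than the authors' algorithm: each robot converts its local Gaussian to canonical (information) form, where fusing Gaussians by multiplication amounts to summing information vectors and matrices, and then runs average consensus on those parameters so that rescaling by the network size recovers the joint. The single Gaussian per robot, the Metropolis weights, and all names here are illustrative assumptions; the paper itself works with sampled elements of weighted mixtures.

```python
import numpy as np

def to_canonical(mu, Sigma):
    """Canonical (information) form: Lambda = Sigma^-1, eta = Sigma^-1 mu."""
    Lam = np.linalg.inv(Sigma)
    return Lam @ mu, Lam

def consensus_joint(mus, Sigmas, A, iters=50):
    """Average consensus on canonical parameters over adjacency matrix A.

    The product (fusion) of Gaussians sums information parameters, so
    n times the consensus average recovers the joint's eta and Lambda.
    """
    n = len(mus)
    etas, Lams = zip(*(to_canonical(m, S) for m, S in zip(mus, Sigmas)))
    etas, Lams = list(etas), list(Lams)
    # Metropolis-Hastings weights guarantee convergence on connected graphs.
    deg = A.sum(axis=1)
    W = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()
    for _ in range(iters):  # stopping early gives a cheaper, approximate fusion
        etas = [sum(W[i, j] * etas[j] for j in range(n)) for i in range(n)]
        Lams = [sum(W[i, j] * Lams[j] for j in range(n)) for i in range(n)]
    # Each robot rescales its local average by n to estimate the joint.
    return [n * e for e in etas], [n * L for L in Lams]

# Two robots observing the same scalar state:
mus = [np.array([1.0]), np.array([3.0])]
Sigmas = [np.eye(1), np.eye(1)]
A = np.array([[0, 1], [1, 0]])
etas, Lams = consensus_joint(mus, Sigmas, A)
print(np.linalg.solve(Lams[0], etas[0]))  # fused mean, approx. 2.0
```

Stopping the consensus loop after a fixed number of iterations trades fusion accuracy for communication, which is the pre-convergence termination the abstract refers to.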
Similar papers
Stochastic approximation for background modelling
Many background modelling approaches are based on mixtures of multivariate Gaussians with diagonal covariance matrices. This often yields good results, but complex backgrounds are not adequately captured, and postprocessing techniques are needed. Here we propose the use of mixtures of uniform distributions and multivariate Gaussians with full covariance matrices. These mixtures are able to cope...
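As a rough illustration of the proposed model (all parameters, weights, and names below are made up), the sketch evaluates a density that mixes full-covariance Gaussians with a single uniform component whose support covers the RGB cube, so observations that no Gaussian explains still receive a small non-zero likelihood.

```python
import numpy as np
from scipy.stats import multivariate_normal

def mixture_density(x, weights, gaussians, uniform_volume):
    """Density of a mixture of full-covariance Gaussians plus one uniform.

    weights: (K+1,) mixing weights summing to 1; the last is the uniform's.
    gaussians: list of K (mean, covariance) pairs.
    uniform_volume: volume of the uniform's support, e.g. 256**3 for RGB.
    """
    p = weights[-1] / uniform_volume  # uniform component absorbs outliers
    for w, (mu, Sigma) in zip(weights[:-1], gaussians):
        p += w * multivariate_normal.pdf(x, mean=mu, cov=Sigma)
    return p

# A pixel modelled by one background Gaussian plus a uniform outlier term:
x = np.array([120.0, 118.0, 125.0])  # RGB observation
gaussians = [(np.array([118.0, 117.0, 126.0]), 25.0 * np.eye(3))]
print(mixture_density(x, np.array([0.95, 0.05]), gaussians, 256.0 ** 3))
```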
Training Mixture Models at Scale via Coresets
How can we train a statistical mixture model on a massive data set? In this paper, we show how to construct coresets for mixtures of Gaussians and natural generalizations. A coreset is a weighted subset of the data, which guarantees that models fitting the coreset also provide a good fit for the original data set. We show that, perhaps surprisingly, Gaussian mixtures admit coresets of size poly...
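To make the coreset guarantee concrete, here is a hypothetical experiment that fits a Gaussian mixture on a small subset and checks that the subset-trained model still scores well on the full data; the uniform subsample is only a crude stand-in for a true coreset, which would be built by importance sampling and fit with per-point weights.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic data from two Gaussian clusters.
X = np.vstack([rng.normal(0.0, 1.0, (50_000, 2)),
               rng.normal(5.0, 1.0, (50_000, 2))])

full = GaussianMixture(n_components=2, random_state=0).fit(X)
# Uniform subsample as a crude stand-in for a coreset.
idx = rng.choice(len(X), size=500, replace=False)
small = GaussianMixture(n_components=2, random_state=0).fit(X[idx])

# Score the subset-trained model on the *full* data: close to the full fit.
print(full.score(X), small.score(X))  # average log-likelihood per sample
```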
Mixtures of Gaussians
This tutorial treats mixtures of Gaussian probability distribution functions. Gaussian mixtures are combinations of a finite number of Gaussian distributions. They are used to model complex multidimensional distributions. When there is a need to learn the parameters of the Gaussian mixture, the EM algorithm is used. In the second part of this tutorial mixtures of Gaussians are used to model the ...
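As a companion to the tutorial's description, here is a minimal hand-rolled EM loop for a one-dimensional Gaussian mixture; the initialisation scheme, iteration count, and function name are arbitrary choices for illustration.

```python
import numpy as np
from scipy.stats import norm

def em_gmm_1d(x, K=2, iters=100, seed=0):
    """Minimal EM for a 1-D Gaussian mixture: alternate responsibilities
    (E-step) and weighted parameter updates (M-step)."""
    rng = np.random.default_rng(seed)
    pi = np.full(K, 1.0 / K)              # mixing weights
    mu = rng.choice(x, K, replace=False)  # initialise means from the data
    var = np.full(K, x.var())
    for _ in range(iters):
        # E-step: responsibility of component k for each point.
        r = pi * norm.pdf(x[:, None], mu, np.sqrt(var))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: maximise the expected complete-data log-likelihood.
        Nk = r.sum(axis=0)
        pi = Nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / Nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    return pi, mu, var

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])
print(em_gmm_1d(x))  # recovers weights near 0.5 and means near -2 and 3
```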
Product of Gaussians as a distributed representation for speech recognition
Distributed representations allow the effective number of Gaussian components in a mixture model, or state of an HMM, to be increased without dramatically increasing the number of model parameters. Various forms of distributed representation have previously been investigated. In this work it is shown that the product of experts (PoE) framework may be viewed as a distributed representation when the...
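For Gaussian experts the product-of-experts combination has a closed form: densities multiply, so precisions and information vectors add. The sketch below shows only this closed form, not the paper's PoE training for speech recognition; all names are illustrative.

```python
import numpy as np

def product_of_gaussians(means, covs):
    """Combine Gaussian 'experts' by multiplying their densities.

    In information form the product just sums precisions and
    information vectors, then renormalises back to moment form.
    """
    Lams = [np.linalg.inv(S) for S in covs]
    Lam = sum(Lams)
    eta = sum(L @ m for L, m in zip(Lams, means))
    Sigma = np.linalg.inv(Lam)
    return Sigma @ eta, Sigma  # mean and covariance of the product

# Two 1-D experts: the product is sharper than either expert alone.
mu, Sigma = product_of_gaussians([np.array([0.0]), np.array([2.0])],
                                 [np.eye(1), np.eye(1)])
print(mu, Sigma)  # mean 1.0, variance 0.5
```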
Publication date: 2012